# TPU efficient training
## Llama 3 Open Ko 8B Instruct Preview
A Korean language model based on continued pre-training of Llama-3-8B, trained on more than 60 GB of deduplicated, publicly available text. Supports Korean and English.
- Category: Large Language Model
- Library: Transformers (multilingual support)
- Author: beomi
## Roberta Swedish
A Swedish pre-trained model based on the RoBERTa architecture, suitable for a variety of natural language processing tasks.
- Category: Large Language Model
- Author: birgermoell